Attractor Neural Networks with Hypercolumns

Authors

  • Christopher Johansson
  • Anders Sandberg
  • Anders Lansner
Abstract

We investigate attractor neural networks with a modular structure, where a local winner-takes-all rule acts within the modules (called hypercolumns). We perform a signal-to-noise analysis of storage capacity and noise tolerance and compare the results with those from simulations. Introducing local winner-takes-all dynamics improves both storage capacity and noise tolerance; the optimal size of the hypercolumns depends on network size and noise level.
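To make the architecture concrete, the following is a minimal NumPy sketch (not the authors' implementation) of a Hopfield-style attractor network partitioned into hypercolumns, with a local winner-takes-all applied inside every hypercolumn at each update step. The function names, toy sizes, and the plain Hebbian learning rule are illustrative assumptions.

```python
import numpy as np

def train(patterns):
    """Hebbian outer-product learning over the stored patterns."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)            # no self-connections
    return w / len(patterns)

def recall(w, state, n_hc, units_per_hc, steps=20):
    """Iterate the network; in each hypercolumn only the unit with the
    largest net input stays active (local winner-takes-all)."""
    s = state.copy()
    for _ in range(steps):
        support = w @ s                              # net input to every unit
        new_s = np.zeros_like(s)
        for h in range(n_hc):                        # local WTA per hypercolumn
            block = slice(h * units_per_hc, (h + 1) * units_per_hc)
            new_s[h * units_per_hc + np.argmax(support[block])] = 1.0
        if np.array_equal(new_s, s):                 # fixed point reached
            break
        s = new_s
    return s

# Toy usage: 4 hypercolumns of 5 units, one active unit per hypercolumn.
rng = np.random.default_rng(0)
n_hc, u = 4, 5
patterns = np.zeros((3, n_hc * u))
for p in patterns:
    for h in range(n_hc):
        p[h * u + rng.integers(u)] = 1.0
w = train(patterns)
noisy = patterns[0].copy()
noisy[0:u] = 0.0                        # corrupt the first hypercolumn
noisy[rng.integers(u)] = 1.0
print(recall(w, noisy, n_hc, u).reshape(n_hc, u))  # with few stored patterns,
                                                   # this typically restores pattern 0
```

Because exactly one unit per hypercolumn is active at all times, the WTA step enforces the fixed activity level that the paper's modular structure provides.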


Related Articles

On the Storage Capacity of an Abstract Cortical Model with Silent Hypercolumns

In this report we investigate the storage capacity of an abstract generic attractor neural network model of the mammalian cortex. This model network has a diluted connection matrix and a fixed activity level that is independent of network size. We develop an analytical model of the storage capacity for this type of network when it is trained with both the Willshaw and the Hopfield learning rule...
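As background for the two learning rules mentioned, here is a generic NumPy sketch of the textbook Willshaw (clipped Hebbian) rule and a Hopfield-style covariance rule for sparse binary patterns, together with a random dilution of the connection matrix. This illustrates the standard rules under assumed toy parameters, not the report's analytical model.

```python
import numpy as np

def hopfield_weights(patterns, a):
    """Covariance (Hopfield-style) rule for sparse 0/1 patterns with
    activity level a: w_ij = sum_mu (xi_i - a)(xi_j - a)."""
    centered = patterns - a
    w = centered.T @ centered
    np.fill_diagonal(w, 0.0)
    return w

def willshaw_weights(patterns):
    """Willshaw (clipped Hebbian) rule: w_ij = 1 if units i and j are
    co-active in at least one stored pattern, else 0."""
    w = (patterns.T @ patterns > 0).astype(float)
    np.fill_diagonal(w, 0.0)
    return w

rng = np.random.default_rng(1)
n, a = 100, 0.1                                    # network size, activity level
patterns = (rng.random((20, n)) < a).astype(float)
w_hop = hopfield_weights(patterns, a)
w_wil = willshaw_weights(patterns)

# Random dilution: keep each connection with probability c.
c = 0.3
w_diluted = w_wil * (rng.random((n, n)) < c)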


Bistable, Irregular Firing and Population Oscillations in a Modular Attractor Memory Network

Attractor neural networks are thought to underlie working memory functions in the cerebral cortex. Several such models have been proposed that successfully reproduce firing properties of neurons recorded from monkeys performing working memory tasks. However, the regular temporal structure of spike trains in these models is often incompatible with experimental data. Here, we show that the in viv...


Self-Organisation of Hypercolumns based on Force-Directed Clustering

Attractor neural networks are often evaluated and found to perform well on sparse random patterns. However, real-world data often contain a large amount of correlation, which tends to degrade the performance of these networks dramatically. This thesis describes the first steps in the creation of a hidden layer that preprocesses data to address these problems. We describe a novel algorithm ...


Improving Robust Pattern Recognition in Attractor Recurrent Neural Networks by Employing Chaos-Like Dynamics

In this paper, two kinds of chaotic neural networks are proposed to evaluate the efficiency of chaotic dynamics in robust pattern recognition. The first model is designed based on natural selection theory: an attractor recurrent neural network intelligently guides the evaluation of the chaotic nodes in order to obtain the best solution. In the second model, a different structure of ch...


A Parallel Implementation of a Bayesian Neural Network with Hypercolumns

A Bayesian Confidence Propagation Neural Network (BCPNN) with hypercolumns is implemented on modern general-purpose parallel computers. Two different parallel programming application program interfaces (APIs) are used: OpenMP and MPI. The hypercolumn is a concept derived from the local connectivity seen between neurons in the cortex. The hypercolumns constitute a natural computational grain and enable...
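As a rough illustration of the parallelization idea (hypercolumns as the natural computational grain), here is a minimal mpi4py sketch in which each MPI rank owns a block of hypercolumns and all ranks exchange winners after every update step. The update rule here is a generic winner-takes-all placeholder rather than the BCPNN rule, and all names and sizes are invented for the example.

```python
# Minimal MPI sketch: each rank owns a contiguous block of hypercolumns,
# updates their winners from the full activity vector, then all ranks
# exchange the results so everyone holds the updated global state.
# Run with e.g.: mpiexec -n 4 python hypercolumn_mpi_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

HC_TOTAL, UNITS = 8, 10              # 8 hypercolumns of 10 units (toy sizes)
assert HC_TOTAL % size == 0
hc_local = HC_TOTAL // size          # hypercolumns owned by this rank
n = HC_TOTAL * UNITS

rng = np.random.default_rng(42)      # same seed -> identical weights on all ranks
w = rng.standard_normal((n, n))      # placeholder weights, not trained BCPNN weights
state = np.zeros(n)
state[::UNITS] = 1.0                 # initial guess: first unit of each hypercolumn

for _ in range(5):
    # Each rank updates only its own hypercolumns (one winner per hypercolumn).
    local = np.zeros(hc_local * UNITS)
    for h in range(hc_local):
        g = rank * hc_local + h                  # global hypercolumn index
        block = slice(g * UNITS, (g + 1) * UNITS)
        support = w[block] @ state               # net input to this block
        local[h * UNITS + np.argmax(support)] = 1.0
    # Gather all blocks so every rank sees the full updated state.
    comm.Allgather(local, state)

if rank == 0:
    print(state.reshape(HC_TOTAL, UNITS))
```

The only communication per step is one Allgather of the winner vector, which is why hypercolumns make a convenient unit of distribution: each rank's block can be updated independently between exchanges.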




Publication year: 2002